
# Parallel corpus fine-tuning

## Sentence Transformers Experimental Hubert Hungarian

License: Apache-2.0
A Hungarian sentence embedding model fine-tuned from the huBERT pre-trained model, designed specifically for sentence similarity tasks.
Tags: Text Embedding, Other
Publisher: NYTK
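
A minimal usage sketch with the sentence-transformers library follows. The model identifier below is assumed from the entry above and may differ from the actual Hub ID; the Hungarian example sentences are illustrative only.

```python
# Minimal sketch: Hungarian sentence similarity with sentence-transformers.
# The model ID below is assumed from the entry above and may differ on the Hub.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("NYTK/sentence-transformers-experimental-hubert-hungarian")

sentences = [
    "A macska az asztalon alszik.",       # "The cat is sleeping on the table."
    "Egy macska szunyókál az asztalon.",  # "A cat is napping on the table."
    "Holnap esni fog az eső.",            # "It will rain tomorrow."
]

# Encode the sentences into dense embeddings and compare them with cosine similarity.
embeddings = model.encode(sentences, convert_to_tensor=True)
scores = util.cos_sim(embeddings, embeddings)

print(scores)  # Higher scores indicate more similar sentence pairs.
```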
## Randeng Deltalm 362M Zh En

A Chinese-English translation model fine-tuned from the DeltaLM base model within the Fengshen framework, incorporating 30 million Chinese-English parallel sentence pairs and 200,000 entries of IWSLT conference data.
Tags: Machine Translation, Transformers, Supports Multiple Languages
Publisher: IDEA-CCNL
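
A hedged sketch of running a sequence-to-sequence translation model with Hugging Face transformers is shown below. The model identifier is assumed from the entry above, and DeltaLM checkpoints released through the Fengshen framework may require the Fengshenbang-LM toolkit's custom model classes rather than the generic Auto classes used here.

```python
# Sketch only: generic Hugging Face seq2seq translation.
# Assumptions: the model ID below matches the entry above; DeltaLM checkpoints
# from the Fengshen framework may instead need Fengshenbang-LM's custom model
# classes, in which case the generic Auto classes here will not load them.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "IDEA-CCNL/Randeng-Deltalm-362M-Zh-En"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "今天天气很好。"  # "The weather is nice today."
inputs = tokenizer(text, return_tensors="pt")

# Generate the English translation from the Chinese source sentence.
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```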